Extenders for vector-valued functions


Similar articles

Holomorphic vector-valued functions

exists. The function f is continuously differentiable when it is differentiable and f′ is continuous. A k-times continuously differentiable function is C^k, and a continuous function is C^0. A V-valued function f is weakly C^k when for every λ ∈ V* the scalar-valued function λ∘f is C^k. This sense of weak differentiability of a function f does not refer to distributional derivatives, but to differe...


Tensorial approximate identities for vector valued functions

The role of an approximate identity is as follows: a function f which has to be approximated is convolved with a kernel Kδ for some δ. If the kernel satisfies certain conditions, then the convolutions converge to f in a certain limit, such as δ → 0+. Note that approximate identities for scalar functions on balls in R are studied, e.g., in [14]. Further works on localizing kernels, like scaling func...
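The convolution scheme sketched in this abstract can be illustrated numerically. The following is a minimal sketch, with the Gaussian kernel, the test function, and the grid chosen by us for illustration (the paper's kernels and domains may differ):

```python
import numpy as np

def gaussian_kernel(x, delta):
    """Gaussian approximate identity K_delta: integrates to 1 and
    concentrates around 0 as delta -> 0+."""
    return np.exp(-x**2 / (2 * delta**2)) / (delta * np.sqrt(2 * np.pi))

def convolve_with_kernel(f_vals, x, delta):
    """Discrete approximation of the convolution (K_delta * f)(x)
    on a uniform grid, by a Riemann sum."""
    dx = x[1] - x[0]
    out = np.empty_like(f_vals)
    for i, xi in enumerate(x):
        out[i] = np.sum(gaussian_kernel(xi - x, delta) * f_vals) * dx
    return out

x = np.linspace(-5.0, 5.0, 2001)
f = np.cos(x)                       # continuous function to approximate
errors = []
for delta in (0.5, 0.1, 0.02):
    approx = convolve_with_kernel(f, x, delta)
    # measure the sup error away from the boundary, where the
    # truncated convolution integral is accurate
    interior = slice(400, 1601)
    errors.append(np.max(np.abs(approx[interior] - f[interior])))
```

As δ shrinks, the sup error on the interior of the grid decreases, which is the convergence the abstract refers to.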


Wiener Tauberian Theorems for Vector-Valued Functions

Different versions of Wiener's Tauberian theorem are discussed for the generalized group algebra L¹(G,A) (of integrable functions on a locally compact abelian group G taking values in a commutative semisimple regular Banach algebra A) using A-valued Fourier transforms. A weak form of Wiener's Tauberian property is introduced, and it is proved that L¹(G,A) is weakly Tauberian if and only if A is...


Error Bounds for Vector-valued Functions: Necessary

In this paper, we attempt to extend the definition and existing local error bound criteria to vector-valued functions, or more generally, to functions taking values in a normed linear space. Some new derivative-like objects (slopes and subdifferentials) are introduced and a general classification scheme of error bound criteria is presented.


Kernels for Vector-Valued Functions: A Review

Kernel methods are among the most popular techniques in machine learning. From a frequentist/discriminative perspective they play a central role in regularization theory as they provide a natural choice for the hypotheses space and the regularization functional through the notion of reproducing kernel Hilbert spaces. From a Bayesian/generative perspective they are the key in the context of Gaus...
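A common construction in this setting is the separable matrix-valued kernel K(x, x′) = k(x, x′)·B, where a scalar kernel k is coupled with a positive-definite matrix B that correlates the outputs. The following is a minimal sketch of vector-valued kernel ridge regression under that construction; the RBF base kernel, the coregionalization matrix B, and the toy data are our own choices, not taken from the review:

```python
import numpy as np

def rbf(X1, X2, lengthscale=0.2):
    """Scalar RBF base kernel k(x, x')."""
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return np.exp(-d2 / (2 * lengthscale**2))

# Coregionalization matrix B: couples the two output components.
B = np.array([[1.0, 0.8],
              [0.8, 1.0]])

X = np.linspace(0.0, 1.0, 20)             # training inputs
Y = np.stack([np.sin(2 * np.pi * X),      # two correlated outputs
              np.sin(2 * np.pi * X + 0.3)], axis=1)

# Gram matrix of the matrix-valued kernel: Kron(k(X, X), B).
# Its (i*2+p, j*2+q) entry is k(x_i, x_j) * B[p, q], matching the
# row-major flattening Y.ravel() = [y_1^(1), y_1^(2), y_2^(1), ...].
K = np.kron(rbf(X, X), B)
alpha = np.linalg.solve(K + 1e-6 * np.eye(K.shape[0]), Y.ravel())

def predict(x_new):
    """Vector-valued kernel ridge prediction at new inputs."""
    K_star = np.kron(rbf(np.atleast_1d(x_new), X), B)
    return (K_star @ alpha).reshape(-1, 2)

pred = predict(X)   # reproduces the training targets up to the small jitter
```

The Kronecker structure is what makes this kernel "separable": input similarity and output correlation factor apart, which the review discusses alongside richer (non-separable) alternatives.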



Journal

Journal title: Studia Mathematica

Year: 2009

ISSN: 0039-3223,1730-6337

DOI: 10.4064/sm191-2-2